Patent abstract:
The invention relates to a method for transferring data relating to a preoperative or operation-related plan to a mixed-reality device, such that it can be used in dental and medical treatments, comprising the steps of: acquiring a diagnostic X-ray exam; digitalising the anatomical shape; overlaying previously obtained radiographic and volumetric records; planning the treatment with treatment planning software; recognising and identifying the anatomical shape of the patient via the mixed-reality device; transferring records to be displayed to the mixed-reality device, same being uploaded onto said device; aligning the images obtained from the treatment plan with the treatment planning software and the data and records identified by the recognition software; and displaying images on the mixed-reality device.
Publication number: ES2804910A2
Application number: ES202090056
Filing date: 2018-05-30
Publication date: 2021-02-09
Inventors: Macho Alvaro Zubizarreta;Ufano Daniel Ortega;Galindo Ana Belén Lobo;Romano Cristina Rico;Alvarez Jesús Mena;Ezpeleta Luis Oscar Alonso;Sindreu Fernando Durán;García María Jesús Suárez;Suárez Carlos López;Panadero Rubén Agustín;Dávila Santiago Berrendero;Centenera María Belén Centenera;Rico Jesús Peláez;Navarro Alberto Ferreiroa;Dominguez Gonzalo Alberto Bajo;Noval Beatriz Vizoso;Sola Santiago Poc;Arribas Celia Tobar;Deglow Elena Riad;Sans Francesc Abella
Applicants: Macho Alvaro Zubizarreta;Ufano Daniel Ortega;
Primary IPC:
Patent description:

[0002] PROCEDURE FOR TRANSFERRING A THERAPEUTIC PLAN TO A MIXED REALITY DEVICE
[0004] OBJECT OF THE INVENTION
[0006] The invention relates to a procedure that makes it possible to transfer a perioperative therapeutic plan, or any part of it, to a mixed reality (MR) device, allowing dental and medical treatments to be carried out in a safe, conservative and predictable way, as well as establishing its application in the educational field.
[0007] The application of the present invention focuses on dental practice, more specifically on the devices and means that facilitate the work of professionals in the sector.
[0009] BACKGROUND
[0011] Currently, the preoperative planning of dental procedures is carried out through the analysis of clinical and radiodiagnostic tests. However, the transfer of these data to the stomatognathic system of the patient under study is complex, and an error of interpretation can lead to the appearance of intraoperative complications of varying severity.
[0013] The appearance of computer-assisted techniques, as adjuncts to the conventional clinical procedure, increases the precision of the surgical act and improves the therapeutic prognosis; however, it requires the fabrication of drill guides to direct the surgical instruments, raising the cost of treatment. Document ES2575509B1 describes a drill guide model for the placement of dental implants that allows the direction and depth of surgical instruments to be controlled during use. Document ES2612308A2 describes a drill guide model, as well as the procedure for planning the surgical procedure and/or prosthetic rehabilitation of dental implants. These milling assist devices have limitations regarding their precision, since they present an average angular deviation of 3.81°, an average horizontal deviation of 0.99 mm at their outermost part and of 1.24 mm at their innermost part (Sicilia A, Botticelli D, Working Group 3. Computer-guided implant therapy and soft- and hard-tissue aspects. The third EAO Consensus Conference 2012. Clin Oral Implants Res. 2012;23(6):157-61). The lack of precision inherent in these devices can lead to intraoperative complications of varying severity.
[0015] Image-guided surgical navigation systems have improved the degree of precision of the previous techniques, allowing more predictable and conservative surgical procedures, as well as favoring immediate prosthetic rehabilitation subsequent to the placement of dental implants. Document WO02/096261A2 describes an image-guided passive surgical navigation device and system, capable of tracking the position and spatial orientation of surgical instruments during dental implant placement and directing their direction in real time. Document US2017/0053437A1 describes a device and a method for positioning a navigation system and reproducing the virtual planning on an augmented reality device. However, these systems are limited to surgical treatments, are associated with a high economic cost and involve a steep learning curve. The degree of precision of the present invention allows its application in all types of dental treatments: surgical procedures, implant surgery, anesthetic procedures, joint treatments, endodontics, orthodontics, periodontics, dental aesthetics and dental prostheses. It can also be applied to other medical disciplines, such as general surgery, plastic surgery, podiatry, ophthalmology, cardiology, traumatology or neurology.
[0017] Mixed reality (MR) devices, understood here as devices with augmented reality and virtual reality capabilities, are able to superimpose a virtual image on a real scene, so that the user has the sensation that the virtual image is part of reality. These devices are generally made up of the following elements:
[0018] - Camera: Element in charge of capturing the images of the scene.
[0019] - Processor: Element in charge of interpreting the captured image and generating the virtual image in order to superimpose it on the real scene.
[0020] - Marker: Element that helps to position the images generated by the processor.
[0021] - Triggers: Set of devices that calculate the position of the mixed reality device, such as a compass, an accelerometer or a GPS receiver.
[0022] The need to find an economical, precise and multidisciplinary alternative that avoids the risks attributed to current procedures is evident, something that is not possible to obtain with the devices and methodologies existing today.
[0024] The present invention aims to provide a procedure that allows the transfer of a preoperative therapeutic plan, or any part of it, to a mixed reality device using, among others, computer means, allowing its visualization superimposed on the patient's anatomical orography in real time.
[0026] DESCRIPTION OF THE INVENTION
[0028] The procedure that is the object of the present invention comprises a series of stages by which it is possible to transfer a perioperative therapeutic plan, or any part of it, to a mixed reality device; said stages are detailed in general terms below:
[0030] 1) Acquisition of a radiodiagnostic examination (computerized axial tomography (CT), cone beam computed tomography (CBCT), magnetic resonance imaging (MRI), orthopantomography (OPG), lateral skull radiography, conventional or digital radiography, or similar) of the tissues involved in the therapeutic procedure, and subsequent storage in digital format (DICOM files (.dcm)). This file is obtained by exposing the patient to a radiological technique of 7-12 mA and 90-120 kVp, with an exposure time ranging between 6 and 10 seconds and a radiological acquisition field between 60x60 mm and 200x170 mm; it allows the reconstruction of an anatomical volume that exceeds the facial massif. Acquisition refers to the radiographic evidence that can be viewed with the mixed reality device: computed tomography, cone beam computed tomography, magnetic resonance imaging, orthopantomography, lateral skull radiography, and conventional or digital intraoral radiography.
[0032] 2) Generation of a digital volumetric model of the tissues involved in the therapeutic procedure (stereolithographic file (.stl)). This can be obtained from:
[0034] a) Digitization of the tissues involved in the therapeutic procedure using an intraoral scanner.
[0036] b) Digitization of a conventional impression of the tissues involved in the therapeutic procedure using an extraoral scanner.
[0038] c) Digitization of a working model of the tissues involved in the therapeutic procedure using an extraoral scanner.
[0040] The acquisition of the digital volumetric model of the tissues involved in the therapeutic procedure can be obtained by:
[0042] 1. An optical sensor present in the scanner, which captures multiple two-dimensional images of the patient's tissues or models that a reconstruction algorithm combines into a three-dimensional file.
[0043] 2. A laser optical sensor present in the scanner.
[0044] 3. A tactile sensor present in the scanner, which traverses the surface of the patient's model, generating a topographic map of the patient and, therefore, a three-dimensional file.
[0045] 3) Superposition of the radiographic and volumetric files obtained in the two previous stages by means of a therapeutic planning program. The overlay is performed manually with an application present in the therapeutic planning software, by aligning ("best-fit") anatomical landmarks present in both files. The alignment is achieved by recognizing at least one anatomical structure present and matching in both files.
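The "best-fit" alignment of matching anatomical landmarks described above can be illustrated with a least-squares rigid registration (the Kabsch algorithm). This is a minimal sketch under that assumption, not the planning software's actual implementation; the landmark coordinates are hypothetical:

```python
import numpy as np

def best_fit_rigid(src, dst):
    """Least-squares rigid transform (Kabsch): find rotation R and
    translation t such that R @ p + t best maps src points onto dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical landmark coordinates (mm) picked in the radiographic
# volume (src) and in the stereolithographic model (dst).
src = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0],
                [0.0, 10.0, 0.0], [0.0, 0.0, 10.0]])
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([5.0, -2.0, 1.0])

R, t = best_fit_rigid(src, dst)
residual = np.linalg.norm(src @ R.T + t - dst)
print(residual)  # near zero when the landmarks match exactly
```

With real data the landmarks never match exactly, and the residual quantifies the quality of the superposition.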
[0047] 4) Therapeutic planning with the therapeutic planning software, where the therapeutic procedure is planned using computer programs and means, through which it is possible to export the files that will later be viewed on the mixed reality (MR) device; these include real files (such as the medical history or part of it, or radiodiagnostic tests) and virtual images obtained from the treatment planning software.
[0049] 5) Recognition of the anatomical orography visualized through the mixed reality device. There are two alternatives for the sensors arranged in the mixed reality device to recognize the anatomical landmarks present in the patient:
[0051] a) The first alternative is based on the recognition of markers ("marker-based AR") determined by anatomical reference points ("image targets") present in the anatomical orography of the patient. The superposition ("matching") and orientation of the therapeutic plan, or part of it, on the anatomical orography of the patient depends on the location of said markers and their recognition by one or more sensors installed in the mixed reality device. The recognition of the markers by these sensors provides information regarding the angle, distance and orientation of the markers with respect to the operator.
[0053] b) The second alternative is based on recognizing the position of the anatomical reference points present in the anatomical orography of the patient ("markerless AR"), by means of photographs taken with a camera installed in the mixed reality device. Recognition of predetermined anatomical locations, used as anatomical reference points, allows the identification of the rest of the patient's anatomical orography. The overlap and orientation of the planning depend on an algorithm that identifies the depth, colors and other characteristics present in the obtained images.
[0055] The recognition of the anatomical reference points is carried out through a sensor installed in the mixed reality device, with the capacity to recognize and follow the anatomical features, or through a device installed in the mixed reality device without recognition capability, which requires a "tracker" or recognition software.
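By way of illustration, the angle, distance and orientation information that a recognized marker provides can be derived from the marker's pose in the camera frame. The following is a minimal sketch with a hypothetical 4x4 pose matrix; it does not come from any specific device SDK:

```python
import numpy as np

# Hypothetical pose of a detected marker in the camera frame of the
# mixed reality device: 3x3 rotation block plus translation (metres).
pose = np.array([
    [1.0, 0.0, 0.0,  0.03],   # marker offset 3 cm to the right
    [0.0, 1.0, 0.0, -0.01],
    [0.0, 0.0, 1.0,  0.40],   # marker 40 cm along the optical axis
    [0.0, 0.0, 0.0,  1.0],
])

t = pose[:3, 3]                       # marker position in camera frame
distance = np.linalg.norm(t)          # distance marker -> operator/camera

# Viewing angle: angle between the optical axis (+z) and the marker direction.
view_angle = np.degrees(np.arccos(t[2] / distance))

# Orientation: tilt of the marker normal (its z-axis) w.r.t. the optical axis.
normal = pose[:3, 2]
tilt = np.degrees(np.arccos(np.clip(normal @ np.array([0.0, 0.0, 1.0]), -1.0, 1.0)))

print(distance, view_angle, tilt)
```

In practice the pose matrix itself is estimated by the device's tracking sensors or recognition software; here only the extraction of angle, distance and orientation from it is shown.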
[0057] 6) Transfer of images, files or virtual images of the preoperative therapeutic plan, or any part of it, to the mixed reality device by means of an application or computer software, where they are superimposed on the anatomical orography of the patient. The files transferred to the mixed reality device are scaled and/or oriented so that the overlap with the patient's orography is exact.
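The scaling and orientation of the transferred files can be sketched as the estimation of a similarity transform (scale, rotation, translation) from matched reference points, for example with Umeyama's method. A minimal illustration with hypothetical coordinates, not the device's actual software:

```python
import numpy as np

def similarity_transform(src, dst):
    """Umeyama estimate of scale s, rotation R and translation t such
    that s * R @ p + t best maps src points onto dst (least squares)."""
    n = src.shape[0]
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / n                  # 3x3 covariance
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                         # avoid reflections
    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / n
    s = np.trace(np.diag(D) @ S) / var_src
    t = mu_d - s * R @ mu_s
    return s, R, t

# Hypothetical matched points: the reference points detected on the
# patient (dst) are twice the scale of the virtual plan (src), shifted.
src = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.],
                [0., 0., 1.], [1., 1., 1.]])
dst = 2.0 * src + np.array([10., 0., -3.])     # scale 2, identity rotation

s, R, t = similarity_transform(src, dst)
print(s)
print(np.allclose(s * src @ R.T + t, dst))
```

The recovered scale and rotation are what the text calls scaling and orienting the transferred files so that the overlap with the patient's orography is exact.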
[0059] 7) Superposition of the "image targets" and the images transferred to the mixed reality device, where the images obtained from the therapeutic planning with the therapeutic planning software are aligned and superimposed with the images identified by the recognition software.
[0060] 8) Viewing images with the mixed reality device.
[0062] In this way, the present invention provides the professional with a therapeutic assistance tool that allows real-time observation of data and images related to the patient's clinical history, radiodiagnostic tests and medical examinations. In addition, it allows virtual images created with the preoperative therapeutic planning software to be projected onto the anatomical orography of the patient, facilitating therapeutic procedures, after prior identification and recognition of the anatomical landmarks present in the patient's anatomical orography.
[0063] In order to complete the description and to help a better understanding of the characteristics of the invention, a figure is presented in which, by way of illustration and not limitation, the following is represented:
[0065] Fig. 1 Representative diagram of the procedure, schematically showing the stages from data acquisition to the visualization of images on the mixed reality device.
[0067] DETAILED DESCRIPTION OF A MODE OF IMPLEMENTATION
[0069] In a preferred embodiment of the invention, the procedure comprises the following steps:
[0071] I) Acquisition of a radiodiagnostic examination, such as CT/CBCT/MRI/OPG/Rx, referring to radiographic evidence that can be observed with the mixed reality device.
[0072] II) Digitization of the anatomical orography.
[0073] III) Superposition of the radiographic and volumetric files obtained in the previous stages.
[0074] IV) Therapeutic planning with therapeutic planning software, in which the treatment is planned and from which the virtual volumetric files that will be visualized on the mixed reality device are exported.
[0075] V) Recognition and identification of the patient's anatomical orography by the mixed reality device is carried out using recognition software that identifies certain anatomical features.
[0076] VI) Transfer of the files to be displayed on the mixed reality device, being uploaded to said device.
[0077] VII) Alignment of the images obtained from the therapeutic planning with the therapeutic planning software and the data and files identified by the recognition software.
[0078] VIII) Viewing images with the mixed reality device with the data from the previous stages.
Claims:
Claims (7)
[1]
1.- Procedure to transfer a therapeutic plan or part of it to a mixed reality device, to be used in dental and medical treatments, which is characterized by comprising the stages of:
i) acquisition of a three-dimensional radiodiagnostic examination of the patient's tissues and subsequent storage in digital format; obtaining radiographic files;
ii) generation of a digital volumetric model of the patient's tissues by means of a scanner, obtaining a stereolithographic file;
iii) superposition and integration of the radiographic files and the stereolithographic file obtained in previous stages, in a therapeutic planning computer program, by aligning the anatomical reference points present in both files;
iv) planning the dental or medical procedure on the file obtained in the previous stage, by means of a therapeutic planning computer program;
v) recognition of the anatomical orography of the patient by means of a recognition software, which by means of sensors arranged in the mixed reality device detects one or more anatomical landmarks present in the patient;
vi) transfer of real files and / or virtual volumetric files of the preoperative planning to the mixed reality device, by means of a computer program, being loaded into said device;
vii) alignment of the images obtained from the treatment planning with the treatment planning software and the data and files identified by the recognition software of the previous stages; and viii) projection on the screen of the mixed reality device of the images from the previous stage, superimposed on the anatomical orography of the patient.
[2]
2.- Procedure to transfer a therapeutic plan to a mixed reality device, according to claim 1, characterized in that the files of stage i) are obtained by exposing the patient to a radiological technique of 7-12 mA and 90-120 kVp, with an exposure time ranging between 6 and 10 seconds and a radiological acquisition field between 60x60 mm and 200x170 mm.
[3]
3.- Procedure to transfer a therapeutic plan to a mixed reality device, according to claim 1, characterized in that the stereolithographic file of stage ii) is obtained by means of an extraoral scanner.
[4]
4.- Procedure to transfer a therapeutic plan to a mixed reality device, according to claim 1, characterized in that the stereolithographic file of stage ii) is obtained by means of an intraoral scanner.
[5]
5. - Procedure to transfer a therapeutic planning to a mixed reality device, according to claim 1, characterized in that the recognition of the anatomical orography of stage v) is based on the recognition of markers determined by anatomical reference points present in the anatomical orography of the patient.
[6]
6.- Procedure to transfer a therapeutic plan to a mixed reality device, according to claim 1, characterized in that the recognition of the anatomical orography of stage v) is based on the recognition of the position of the anatomical reference points present in the anatomical orography of the patient, by means of photographs taken with a camera installed in the mixed reality device.
[7]
7.- Procedure to transfer a therapeutic plan to a mixed reality device, according to claim 1, characterized in that the files transferred in stage vi) are scaled and / or oriented so that the overlap with the patient's orography is accurate using computer software.
Similar technologies:
Publication No. | Publication date | Patent title
CN107847278B|2020-11-10|Targeting system for providing visualization of a trajectory for a medical instrument
US7203277B2|2007-04-10|Visualization device and method for combined patient and object image data
ES2199737T3|2004-03-01|PROCEDURE AND DEVICE FOR ODONTOLOGICAL TREATMENT HELPED WITH NAVIGATION.
Xiaojun et al.2009|Image guided oral implantology and its application in the placement of zygoma implants
Hong et al.2008|Medical navigation system for otologic surgery based on hybrid registration and virtual intraoperative computed tomography
TWI396523B|2013-05-21|System for facilitating dental diagnosis and treatment planning on a cast model and method used thereof
JP2021529618A|2021-11-04|Augmented reality-guided surgery methods and systems
US20210338107A1|2021-11-04|Systems, devices and methods for enhancing operative accuracy using inertial measurement units
ES2808210T3|2021-02-25|Dynamic dental arch map
US20170143445A1|2017-05-25|Method and apparatus for operating a dental diagnostic image generation system
JP2018079309A|2018-05-24|Head registration using personalized gripper
Tsuji et al.2006|A new navigation system based on cephalograms and dental casts for oral and maxillofacial surgery
Wagner et al.1999|Clinical experience with interactive teleconsultation and teleassistance in craniomaxillofacial surgical procedures
KR102105974B1|2020-04-29|Medical imaging system
Kim et al.2013|Direct and continuous localization of anatomical landmarks for image-guided orthognathic surgery
Verma et al.2017|Virtual Preoperative Planning and Intraoperative Navigation in Facial Prosthetic Reconstruction: A Technical Note.
ES2804910A2|2021-02-09|Method for transferring a treatment plan to a mixed-reality device
JP6542069B2|2019-07-10|Marking the fluoroscope field of view
Eggers2018|Image-guided surgical navigation
ES2639175T3|2017-10-25|Three-dimensional body
Rana et al.2017|Computer-Assisted Head and Neck Oncologic Surgery
US20220031421A1|2022-02-03|System and method for dynamic augmented reality imaging of an anatomical site
Comlekciler et al.2014|Measuring the optimum lux value for more accurate measurement of stereo vision systems in operating room of Orthognathic surgery
Al-Khaled et al.2021|A Review of Augmented Reality Systems and an Update on The Most Recent Technological Developments and Applications in Dentistry
Gellrich et al.2018|Navigation and Computer-Assisted Craniomaxillofacial Surgery
Patent family:
Publication No. | Publication date
WO2019229275A1|2019-12-05|
ES2804910R2|2021-09-22|
Cited references:
Publication No. | Filing date | Publication date | Applicant | Patent title

US20180140362A1|2015-04-07|2018-05-24|King Abdullah University Of Science And Technology|Method, apparatus, and system for utilizing augmented reality to improve surgery|
US9861446B2|2016-03-12|2018-01-09|Philipp K. Lang|Devices and methods for surgery|
US20180078316A1|2016-09-22|2018-03-22|Medtronic Navigation, Inc.|System for Guided Procedures|
US10660728B2|2016-10-20|2020-05-26|Baliram Maraj|Systems and methods for dental treatment utilizing mixed reality and deep learning|
Legal status:
2021-02-09| BA2A| Patent application published|Ref document number: 2804910 Country of ref document: ES Kind code of ref document: A2 Effective date: 20210209 |
2021-09-22| EC2A| Search report published|Ref document number: 2804910 Country of ref document: ES Kind code of ref document: R2 Effective date: 20210915 |
2022-01-13| FA2A| Application withdrawn|Effective date: 20220107 |
Priority:
Application No. | Filing date | Patent title
PCT/ES2018/070391|WO2019229275A1|2018-05-30|2018-05-30|Method for transferring a treatment plan to a mixed-reality device|